CrysGNN: Distilling Pre-trained Knowledge to Enhance Property Prediction for Crystalline Materials
Authors
Abstract
In recent years, graph neural network (GNN) based approaches have emerged as a powerful technique to encode the complex topological structure of crystal materials in an enriched representation space. These models are often supervised in nature and, using property-specific training data, learn the relationship between crystal structure and different properties like formation energy, bandgap, bulk modulus, etc. Most of these methods require a huge amount of property-tagged data to train the system, which may not be available for many properties. However, a huge amount of crystal data is available with its chemical composition and structural bonds. To leverage these untapped data, this paper presents CrysGNN, a new pre-trained GNN framework for crystalline materials, which captures both node and graph level structural information of crystal graphs using a huge amount of unlabelled material data. Further, we extract distilled knowledge from CrysGNN and inject it into different state-of-the-art property predictors to enhance their property prediction accuracy. We conduct extensive experiments to show that, with distilled knowledge from the pre-trained model, all the SOTA algorithms are able to outperform their own vanilla versions by good margins. We also observe that the distillation process provides significant improvement over the conventional approach of finetuning the pre-trained model. We will release the pre-trained model along with a large dataset of 800K crystal graphs which we carefully curated, so that the pre-trained model can be plugged into any existing and upcoming model.
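The "inject distilled knowledge" step described in the abstract can be sketched as adding an embedding-matching term to the property predictor's supervised loss. Everything below is an illustrative assumption, not the paper's exact objective: the MSE form, the weight `lam`, and all function names are hypothetical.

```python
def mse(a, b):
    """Mean squared error between two equal-length embedding vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def total_loss(property_pred, property_true, teacher_embed, student_embed, lam=0.5):
    """Supervised property loss plus a distillation term.

    The distillation term pulls the property predictor's (student's)
    embedding toward the pre-trained teacher's embedding; `lam` trades
    off the two objectives. Both the matching criterion and the weight
    are assumptions for illustration.
    """
    supervised = (property_pred - property_true) ** 2
    distill = mse(teacher_embed, student_embed)
    return supervised + lam * distill

# Perfect prediction and perfectly matched embeddings give zero loss.
loss = total_loss(1.0, 1.0, [0.2, -0.1], [0.2, -0.1])
```

The key design point is that the distillation term is architecture-agnostic: any existing predictor can add it without changing its own supervised head, which is what makes the pre-trained model "pluggable".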
Similar resources
Distilling Knowledge from an Ensemble of Models for Punctuation Prediction
This paper proposes an approach to distill knowledge from an ensemble of models to a single deep neural network (DNN) student model for punctuation prediction. This approach makes the DNN student model mimic the behavior of the ensemble. The ensemble consists of three single models. Kullback-Leibler (KL) divergence is used to minimize the difference between the output distribution of the DNN st...
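The KL-divergence matching described in this snippet can be sketched in a few lines. This is a generic Hinton-style distillation loss under stated assumptions (temperature value, logit averaging as the ensemble combination), not the paper's exact setup.

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over a list of logits."""
    exps = [math.exp(z / T) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def kl_divergence(p, q):
    """KL(p || q) between two discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def distillation_loss(teacher_logits, student_logits, T=2.0):
    """Make the student's softened output distribution mimic the teacher's.

    The T**2 factor keeps gradient magnitudes comparable across
    temperatures (standard Hinton-style scaling); T=2.0 is an assumption.
    """
    p = softmax(teacher_logits, T)  # ensemble (teacher) targets
    q = softmax(student_logits, T)  # student predictions
    return (T ** 2) * kl_divergence(p, q)

# Averaging three teachers' logits stands in for the ensemble output.
teachers = [[2.0, 0.5, -1.0], [1.8, 0.7, -0.9], [2.2, 0.4, -1.1]]
ensemble = [sum(col) / len(col) for col in zip(*teachers)]
loss = distillation_loss(ensemble, [1.0, 1.0, 0.0])
```

Minimizing this loss over the training set is what "mimicking the behavior of the ensemble" means operationally: the student matches soft output distributions, not just hard labels.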
Distilling Model Knowledge
Top-performing machine learning systems, such as deep neural networks, large ensembles and complex probabilistic graphical models, can be expensive to store, slow to evaluate and hard to integrate into larger systems. Ideally, we would like to replace such cumbersome models with simpler models that perform equally well. In this thesis, we study knowledge distillation, the idea of extracting the...
Distilling Task Knowledge from How-To Communities
Knowledge graphs have become a fundamental asset for search engines. A fair amount of user queries seek information on problem-solving tasks such as building a fence or repairing a bicycle. However, knowledge graphs completely lack this kind of how-to knowledge. This paper presents a method for automatically constructing a formal knowledge base on tasks and task-solving steps, by tapping the co...
Hierarchical Clustering for Fuzzy Modeling of Materials Property Prediction
A simple and effective fuzzy clustering approach is presented for fuzzy modeling from industrial data. In this approach, fuzzy clustering is implemented in two phases: data compression by a self-organizing network, and fuzzy partitioning via fuzzy cmeans clustering associated with a proposed cluster validity measure. The approach is used to extract fuzzy models from data and find out the optima...
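The fuzzy c-means step in the second phase can be sketched as alternating membership and center updates. This is a minimal 1-D textbook FCM, not the paper's exact method; the fuzzifier `m=2.0`, the extreme-point initialization, and the fixed iteration count are all assumptions.

```python
def fuzzy_c_means(points, c=2, m=2.0, iters=50):
    """Minimal 1-D fuzzy c-means (illustrative sketch).

    points: list of floats; c: number of clusters; m: fuzzifier (> 1).
    Returns (centers, memberships) where memberships[k][i] is the degree
    to which point k belongs to cluster i (rows sum to 1).
    """
    # initialise centers at the extremes of the data (an assumption)
    centers = [min(points), max(points)] if c == 2 else points[:c]
    u = [[0.0] * c for _ in points]
    for _ in range(iters):
        # update memberships from distances to current centers
        for k, x in enumerate(points):
            d = [abs(x - v) or 1e-12 for v in centers]  # avoid divide-by-zero
            for i in range(c):
                u[k][i] = 1.0 / sum((d[i] / d[j]) ** (2 / (m - 1)) for j in range(c))
        # update centers as membership-weighted means
        for i in range(c):
            den = sum(u[k][i] ** m for k in range(len(points)))
            centers[i] = sum((u[k][i] ** m) * x for k, x in enumerate(points)) / den
    return centers, u

centers, u = fuzzy_c_means([0.0, 1.0, 10.0, 11.0])
```

Unlike hard clustering, every point keeps a graded membership in every cluster, which is what makes the partition usable for building fuzzy rules.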
Fixed point property for Banach algebras associated to locally compact groups
This thesis studies the fixed point property, and the fixed point property for left reversible semigroups, on certain Banach algebras, including the Fourier algebra and the Fourier-Stieltjes algebra. For example, it is shown that if the group is a locally compact group with a compact neighbourhood of the identity that is invariant under endomorphisms, then the Fourier algebra and the Fourier-Stieltjes algebra have the fixed point property for left reversible semigroups if and only if ...
Journal
Journal title: Proceedings of the ... AAAI Conference on Artificial Intelligence
Year: 2023
ISSN: 2159-5399, 2374-3468
DOI: https://doi.org/10.1609/aaai.v37i6.25892